Rényi's divergence and entropy rates for finite alphabet Markov sources

Authors

  • Ziad Rached
  • Fady Alajaji
  • L. Lorne Campbell
Abstract

In this work, we examine the existence and the computation of the Rényi divergence rate, lim_{n→∞} (1/n) D_α(p^(n) ‖ q^(n)), between two time-invariant finite-alphabet Markov sources of arbitrary order and arbitrary initial distributions, described by the probability distributions p^(n) and q^(n), respectively. This generalizes a result of Nemetz, who assumed that the initial probabilities under p^(n) and q^(n) are strictly positive. The main tools used to obtain the Rényi divergence rate are the theory of nonnegative matrices and Perron–Frobenius theory. We also provide numerical examples and investigate the limits of the Rényi divergence rate as α → 1 and as α → 0. Similarly, we provide a formula for the Rényi entropy rate lim_{n→∞} (1/n) H_α(p^(n)) of Markov sources and examine its limits as α → 1 and as α → 0. Finally, we briefly provide an application to source coding.
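The spectral formula behind this result can be sketched numerically. The snippet below is an illustrative implementation, assuming first-order irreducible chains with strictly positive transition matrices (the positivity setting of Nemetz's original result): the Rényi divergence rate of order α is (1/(α−1)) log λ, where λ is the Perron–Frobenius eigenvalue of the matrix with entries P_ij^α Q_ij^(1−α), and the Rényi entropy rate is (1/(1−α)) log of the largest eigenvalue of the matrix with entries P_ij^α. The function names `renyi_divergence_rate` and `renyi_entropy_rate` are illustrative, not from the paper.

```python
import numpy as np

def renyi_divergence_rate(P, Q, alpha):
    """Rényi divergence rate of order alpha between two irreducible
    Markov chains with strictly positive transition matrices P and Q
    (a sketch of the spectral formula, in nats)."""
    # Entry-wise "twisted" matrix whose Perron-Frobenius eigenvalue
    # drives the exponential growth of sum_x p(x)^alpha q(x)^(1-alpha).
    R = P ** alpha * Q ** (1.0 - alpha)
    lam = max(abs(np.linalg.eigvals(R)))  # Perron-Frobenius eigenvalue
    return float(np.log(lam) / (alpha - 1.0))

def renyi_entropy_rate(P, alpha):
    """Rényi entropy rate of order alpha of a Markov chain with
    transition matrix P (same spectral idea, in nats)."""
    lam = max(abs(np.linalg.eigvals(P ** alpha)))
    return float(np.log(lam) / (1.0 - alpha))
```

As α → 1 the divergence rate approaches the ordinary Kullback–Leibler divergence rate, which can serve as a sanity check for small perturbations of α around 1.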

Related articles

A measure of relative entropy between individual sequences with application to universal classification

A new notion of empirical informational divergence (relative entropy) between two individual sequences is introduced. If the two sequences are independent realizations of two finite-order, finite-alphabet, stationary Markov processes, the empirical relative entropy converges to the relative entropy almost surely. This new empirical divergence is based on a version of the Lempel-Ziv data compress...

The Relative Entropy Rate For Two Hidden Markov Processes

The relative entropy rate is a natural and useful measure of distance between two stochastic processes. In this paper we study the relative entropy rate between two Hidden Markov Processes (HMPs), which is of both theoretical and practical importance. We give new results showing analyticity, representation using Lyapunov exponents, and Taylor expansion for the relative entropy rate of two discr...

There is no universal source code for an infinite source alphabet

We show that a discrete infinite distribution with finite entropy cannot be estimated consistently in information divergence. As a corollary we get that there is no universal source code for an infinite source alphabet over the class of all discrete memoryless sources with finite entropy.

Permutation Complexity and Coupling Measures in Hidden Markov Models

In [Haruna, T. and Nakajima, K., 2011. Physica D 240, 1370–1377], the authors introduced the duality between values (words) and orderings (permutations) as a basis to discuss the relationship between information theoretic measures for finite-alphabet stationary stochastic processes and their permutation analogues. It has been used to give a simple proof of the equality between the entropy rate a...

The permutation entropy rate equals the metric entropy rate for ergodic information sources and ergodic dynamical systems

Permutation entropy quantifies the diversity of possible orderings of the values a random or deterministic system can take, as Shannon entropy quantifies the diversity of values. We show that the metric and permutation entropy rates—measures of new disorder per new observed value—are equal for ergodic finite-alphabet information sources (discrete-time stationary stochastic processes). With this...
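The ordinal-pattern counting underlying permutation entropy is straightforward to sketch. The snippet below is a minimal illustration, not code from the cited paper; the function name `permutation_entropy` and the choice of natural logarithm (nats, unnormalized) are assumptions, and normalization conventions vary in the literature.

```python
import math
from collections import Counter

def permutation_entropy(x, order):
    """Shannon entropy (in nats) of the ordinal patterns of length
    `order` occurring in the sequence x: each sliding window is mapped
    to the permutation that sorts it, and the entropy of the empirical
    pattern distribution is returned."""
    patterns = Counter(
        tuple(sorted(range(order), key=lambda k: x[i + k]))
        for i in range(len(x) - order + 1)
    )
    n = sum(patterns.values())
    return -sum(c / n * math.log(c / n) for c in patterns.values())
```

A strictly monotone sequence produces a single ordinal pattern and hence zero permutation entropy, while more disordered sequences spread mass over more patterns.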



Journal:
  • IEEE Trans. Information Theory

Volume 47, Issue

Pages  -

Publication year: 2001